Light field super-resolution using complementary-view feature attention

Authors

Abstract

Light field (LF) cameras record multiple perspectives by a sparse sampling of real scenes, and these perspectives provide complementary information. This information is beneficial to LF super-resolution (LFSR). Compared with traditional single-image super-resolution, LFSR can exploit the parallax structure and perspective correlation among different views. Furthermore, the performance of existing methods is limited, as they fail to deeply explore the complementary information across views. In this paper, we propose a novel network, called the light field complementary-view feature attention network (LF-CFANet), to improve LFSR by dynamically learning the complementary information in different views. Specifically, we design a residual complementary-view spatial and channel attention module (RCSCAM) to effectively interact complementary-view information between views. Moreover, RCSCAM captures the relationships between different channels, so it is able to generate informative features for reconstructing LF images while ignoring redundant information. Then, a maximum-difference information supplementary branch (MDISB) is used to supplement information from angular positions based on the geometric structure of LF images, and also to guide the process of reconstruction. Experimental results on both synthetic and real-world datasets demonstrate the superiority of our method. The proposed LF-CFANet achieves more advanced reconstruction, displaying more faithful details and higher SR accuracy than state-of-the-art methods.
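The abstract centers on a spatial-channel attention module (RCSCAM). The sketch below is not the paper's architecture; it illustrates only the generic channel-attention idea such modules build on (squeeze each channel to a scalar, compute a per-channel gate, rescale the channel). All function names and the toy weights are invented for illustration:

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def channel_attention(feature_maps, weights1, weights2):
    """Toy channel attention: squeeze (global average pool per channel),
    excite (tiny two-layer gate), then rescale each channel.
    feature_maps: list of C channels, each an H x W list of lists."""
    # Squeeze: one scalar per channel via global average pooling
    pooled = [sum(sum(row) for row in ch) / (len(ch) * len(ch[0]))
              for ch in feature_maps]
    # Excite: hidden layer (ReLU), then a per-channel gate in (0, 1)
    hidden = [max(0.0, sum(w * p for w, p in zip(ws, pooled)))
              for ws in weights1]
    gates = [sigmoid(sum(w * h for w, h in zip(ws, hidden)))
             for ws in weights2]
    # Rescale: emphasize informative channels, suppress redundant ones
    return [[[v * g for v in row] for row in ch]
            for ch, g in zip(feature_maps, gates)]
```

A real module of this kind would learn `weights1`/`weights2` by backpropagation, and RCSCAM additionally models interactions across LF views, which this single-view toy omits entirely.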


Similar articles

Light Field Super-Resolution Via Graph-Based Regularization

Light field cameras can capture the 3D information in a scene with a single shot. This special feature makes light field cameras very appealing for a variety of applications: from the popular post-capture refocus, to depth estimation and image-based rendering. However, light field cameras suffer by design from strong limitations in their spatial resolution, which should therefore be augmented by...


Light field super resolution through controlled micro-shifts of light field sensor

Light field cameras present new capabilities, such as post-capture refocusing and aperture control, by capturing the directional and spatial distribution of light rays in space. Among different light field camera implementations, micro-lens array based light field cameras are a cost-effective and compact approach to capturing the light field. One drawback of the micro-lens array based light field ca...


Lensfree on-chip microscopy over a wide field-of-view using pixel super-resolution

We demonstrate lensfree holographic microscopy on a chip to achieve approximately 0.6 μm spatial resolution, corresponding to a numerical aperture of approximately 0.5, over a large field-of-view of approximately 24 mm². By using partially coherent illumination from a large aperture (approximately 50 μm), we acquire lower resolution lensfree in-line holograms of the objects with unit frin...


Image Sequence Super-resolution based on Learning using Feature Descriptors

There is currently a growing demand for high-resolution images and videos in several domains of knowledge, such as surveillance, remote sensing, medicine, industrial automation, and microscopy, among others. High-resolution images provide details that are important for the analysis and visualization of the data present in the images. However, due to the cost of high-precision sensors and the limita...


Light Field Super-Resolution using a Low-Rank Prior and Deep Convolutional Neural Networks

Light field imaging has recently seen a resurgence of interest due to the availability of practical light field capture systems that offer a wide range of applications in the field of computer vision. However, capturing high-resolution light fields remains technologically challenging, since an increase in angular resolution is often accompanied by a significant reduction in spatial resolution. T...



Journal

Journal title: Computational Visual Media

Year: 2023

ISSN: 2096-0662, 2096-0433

DOI: https://doi.org/10.1007/s41095-022-0297-1